[feat] Completion API supports passing input token ids in either prompt or prompt_token_ids
#3311
Requirement description
The Completion API should support passing token ids directly through the `prompt` field as model input, for alignment with vLLM. The `prompt_token_ids` field introduced in FD v2.0.4 remains supported; when both are provided, `prompt_token_ids` tentatively takes precedence over `prompt`. In addition, `prompt_token_ids` previously supported only single-request inference; batch inference is now supported as well.
Single-request inference:
Batch inference:
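The request shapes described above can be sketched as follows. This is a hedged illustration of possible payloads for an OpenAI-compatible `/v1/completions` endpoint; the model name and token id values are placeholders, and the exact schema should be checked against the FD implementation in this PR.

```python
import json

# Single-request inference: token ids passed directly in "prompt",
# mirroring vLLM's behavior of accepting a list of ints instead of text.
single_via_prompt = {
    "model": "default",  # placeholder model name
    "prompt": [101, 2023, 2003, 1037, 3231, 102],  # token ids, not a string
    "max_tokens": 16,
}

# Single-request inference via the FD-specific "prompt_token_ids" field.
# Per the description, when both fields are present, prompt_token_ids
# tentatively takes precedence over prompt.
single_via_field = {
    "model": "default",
    "prompt": "ignored when prompt_token_ids is set",
    "prompt_token_ids": [101, 2023, 2003, 1037, 3231, 102],
    "max_tokens": 16,
}

# Batch inference (newly supported): a list of token-id lists,
# one inner list per request in the batch.
batch_via_field = {
    "model": "default",
    "prompt_token_ids": [
        [101, 2023, 102],
        [101, 2003, 1037, 3231, 102],
    ],
    "max_tokens": 16,
}

# Payloads would be sent as the JSON body of a POST to /v1/completions.
print(json.dumps(batch_via_field, indent=2))
```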
Main changes